Review for NeurIPS paper: Replica-Exchange Nosé-Hoover Dynamics for Bayesian Learning on Large Datasets

Neural Information Processing Systems

Summary and Contributions: The paper considers the problem of sampling from the posterior distribution in Bayesian inference. More precisely, it addresses stochastic sampling that relies only on minibatches of data at each iteration. To achieve rapid mixing between isolated modes, the authors run parallel tempered chains and introduce replica-exchange steps into stochastic Nosé-Hoover dynamics. The crux of this approach is the stochastic test for the replica-exchange step. To develop such a test, the authors follow the paper [An efficient minibatch acceptance test for Metropolis-Hastings], which introduces the concept of a correction distribution.
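For context, the exact (full-data) replica-exchange test that the stochastic minibatch test approximates can be sketched as below. This is a generic parallel-tempering swap step, not the paper's algorithm; the function names and the adjacent-pair proposal scheme are illustrative assumptions.

```python
import math
import random

def swap_probability(beta_i, beta_j, energy_i, energy_j):
    """Metropolis acceptance probability for exchanging the states of two
    tempered chains with inverse temperatures beta_i and beta_j."""
    log_ratio = (beta_i - beta_j) * (energy_i - energy_j)
    return min(1.0, math.exp(log_ratio))

def replica_exchange_step(states, energies, betas, rng):
    """Propose a swap between a randomly chosen adjacent pair of replicas
    and accept it with the Metropolis probability above."""
    k = rng.randrange(len(states) - 1)
    if rng.random() < swap_probability(betas[k], betas[k + 1],
                                       energies[k], energies[k + 1]):
        states[k], states[k + 1] = states[k + 1], states[k]
        energies[k], energies[k + 1] = energies[k + 1], energies[k]
    return states, energies
```

In the stochastic setting the energies are estimated from minibatches, so the comparison above becomes noisy; the correction-distribution idea injects an extra random variable so that the noisy test still targets the correct acceptance rule.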


Differentially Private Markov Chain Monte Carlo

Heikkilä, Mikko A., Jälkö, Joonas, Dikmen, Onur, Honkela, Antti

arXiv.org Machine Learning

Recent developments in differentially private (DP) machine learning and DP Bayesian learning have enabled learning under strong privacy guarantees for the training data subjects. In this paper, we further extend the applicability of DP Bayesian learning by presenting the first general DP Markov chain Monte Carlo (MCMC) algorithm whose privacy guarantees are not subject to unrealistic assumptions on Markov chain convergence and that is applicable to posterior inference in arbitrary models. Our algorithm is based on a decomposition of the Barker acceptance test that allows evaluating the Rényi DP privacy cost of the accept-reject choice. We further show how to improve the DP guarantee through data subsampling and approximate acceptance tests.
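The Barker acceptance test mentioned in the abstract accepts a proposal with probability e^Δ/(1 + e^Δ), where Δ is the log acceptance ratio. Equivalently, one accepts iff Δ + V > 0 for a standard logistic variable V, and it is this additive form that lends itself to decomposing the randomness for privacy accounting. A minimal sketch of the test itself (variable and function names are our own, not from the paper):

```python
import math
import random

def barker_accept(log_ratio, rng):
    """Barker test: accept with probability exp(d) / (1 + exp(d)).
    Implemented in the equivalent form: accept iff log_ratio + V > 0,
    where V ~ Logistic(0, 1) is drawn by inverting the logistic CDF."""
    u = rng.random()
    v = math.log(u) - math.log1p(-u)  # inverse-CDF sample from Logistic(0, 1)
    return log_ratio + v > 0
```

The additive-noise form is what makes the decomposition possible: part of V can be attributed to subsampling noise in the estimate of Δ, with the remainder drawn from a correction distribution.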